
    Tool use induces complex and flexible plasticity of human body representations

    Plasticity of body representation fundamentally underpins human tool use. Recent studies have demonstrated remarkably complex plasticity of body representation in humans, showing that such plasticity: (1) occurs flexibly across multiple time-scales, and (2) involves multiple body representations responding differently to tool use. Such findings reveal the sophistication of body plasticity in humans, suggesting that Vaesen may overestimate the similarity of such mechanisms in humans and non-human primates.

    Sense of agency primes manual motor responses

    Perceiving the body influences how we perceive and respond to stimuli in the world. We investigated the respective effects of different components of bodily representation - the senses of ownership and agency - on responses to simple visual stimuli. Participants viewed a video image of their hand on a computer monitor presented either in real time, or with a systematic delay. Blocks began with an induction period in which the index finger was (i) brushed, (ii) passively moved, or (iii) actively moved by the participant. Subjective reports showed that the sense of ownership over the seen hand emerged with synchronous video, regardless of the type of induction, whereas the sense of agency over the hand emerged only following synchronous video with active movement. Following induction, participants responded as quickly as possible to the onset of visual stimuli near the hand by pressing a button with their other hand. Reaction time was significantly speeded when participants had a sense of agency over their seen hand. This effect was eliminated when participants responded vocally, suggesting that it reflects priming of manual responses, rather than enhanced stimulus detection. These results suggest that vision of one's own hand and, specifically, the sense of agency over that hand primes manual motor responses.

    What is it like to have a body?

    Few questions in psychology are as fundamental or as elusive as the sense of one’s own body. Despite widespread recognition of the link between body and self, psychology has only recently developed methods for the scientific study of bodily awareness. Experimental manipulations of embodiment in healthy volunteers have allowed important advances in knowledge. Synchronous multisensory inputs from different modalities play a fundamental role in producing ‘body ownership’, the feeling that my body is ‘mine’. Indeed, appropriate multisensory stimulation can induce ownership over external objects, virtual avatars, and even other people’s bodies. We argue that bodily experience is not monolithic, but has measurable internal structure and components that can be identified psychometrically and psychophysically, suggesting that the apparent phenomenal unity of self-consciousness may be illusory. We further review evidence that the sense of one’s own body is highly plastic, with representations of body structure and size particularly prone to multisensory influences.

    A 2.5-D representation of the human hand

    Primary somatosensory maps in the brain represent the body as a discontinuous, fragmented set of 2-D skin regions. We nevertheless experience our body as a coherent 3-D volumetric object. The links between these different aspects of body representation, however, remain poorly understood. Perceiving the body’s location in external space requires that immediate afferent signals from the periphery be combined with stored representations of body size and shape. At least for the back of the hand, this body representation is massively distorted, in a highly stereotyped manner. Here we test whether a common pattern of distortions applies to the entire hand as a 3-D object, or whether each 2-D skin surface has its own characteristic pattern of distortion. Participants judged the location in external space of landmark points on the dorsal and palmar surfaces of the hand. By analyzing the internal configuration of judgments, we produced implicit maps of each skin surface. Qualitatively similar distortions were observed in both cases. The distortions were correlated across participants, suggesting that the two surfaces are bound into a common underlying representation. The magnitude of distortion, however, was substantially smaller on the palmar surface, suggesting that this binding is incomplete. The implicit representation of the human hand may be a hybrid, intermediate between a 2-D representation of individual skin surfaces and a 3-D representation of the hand as a volumetric object.

    Attention modulates the specificity of automatic imitation to human actors

    The perception of actions performed by others activates one’s own motor system. Recent studies disagree as to whether this effect is specific to actions performed by other humans, an issue complicated by differences in perceptual salience between human and non-human stimuli. We addressed this issue by examining the automatic imitation of actions elicited by viewing a virtual, computer-generated hand. This stimulus was held constant across conditions, but participants’ attention to the virtualness of the hand was manipulated by informing some participants during instructions that they would see a “computer-generated model of a hand,” while making no mention of this to others. In spite of this attentional manipulation, participants in both conditions were generally aware of the virtualness of the hand. Nevertheless, automatic imitation of the virtual hand was significantly reduced, but not eliminated, when participants were told they would see a virtual hand. These results demonstrate that attention modulates the “human bias” of automatic imitation to non-human actors.

    Merging second-person and first-person neuroscience

    Schilbach et al. contrast second-person and third-person approaches to social neuroscience. We discuss relations between second-person and first-person approaches, arguing that they cannot be studied in isolation. Contingency is central for converging first- and second-person approaches. Studies of embodiment show how contingencies scaffold the first-person perspective and how the transition from a third- to a second-person perspective fundamentally involves first-person contributions.

    Automatic imitation of biomechanically possible and impossible actions: effects of priming movements versus goals

    Recent behavioral, neuroimaging, and neurophysiological research suggests a common representational code mediating the observation and execution of actions; yet, the nature of this representational code is not well understood. The authors address this question by investigating (a) whether this observation-execution matching system (or mirror system) codes both the constituent movements of an action as well as its goal and (b) how such sensitivity is influenced by top-down effects of instructions. The authors tested the automatic imitation of observed finger actions while manipulating whether the movements were biomechanically possible or impossible, but holding the goal constant. When no mention was made of this difference (Experiment 1), comparable automatic imitation was elicited from possible and impossible actions, suggesting that the actions had been coded at the level of the goal. When attention was drawn to this difference (Experiment 2), however, only possible movements elicited automatic imitation. This sensitivity was specific to imitation, not affecting spatial stimulus–response compatibility (Experiment 3). These results suggest that automatic imitation is modulated by top-down influences, coding actions in terms of both movements and goals depending on the focus of attention.

    More than skin deep: body representation beyond primary somatosensory cortex

    The neural circuits underlying initial sensory processing of somatic information are relatively well understood. In contrast, the processes that go beyond primary somatosensation to create more abstract representations related to the body are less clear. In this review, we focus on two classes of higher-order processing beyond somatosensation. Somatoperception refers to the process of perceiving the body itself, and particularly of ensuring somatic perceptual constancy. We review three key elements of somatoperception: (a) remapping information from the body surface into an egocentric reference frame, (b) exteroceptive perception of objects in the external world through their contact with the body, and (c) interoceptive percepts about the nature and state of the body itself. Somatorepresentation, in contrast, refers to the essentially cognitive process of constructing semantic knowledge and attitudes about the body, including: (d) lexical-semantic knowledge about bodies generally and one’s own body specifically, (e) configural knowledge about the structure of bodies, (f) emotions and attitudes directed towards one’s own body, and (g) the link between physical body and psychological self. We review a wide range of neuropsychological, neuroimaging and neurophysiological data to explore the dissociation between these different aspects of higher somatosensory function.

    Expansion of perceptual body maps near – but not across – the wrist

    Perceiving the external spatial location of touch requires that tactile information about the stimulus location on the skin be integrated with proprioceptive information about the location of the body in external space, a process called tactile spatial remapping. Recent results have suggested that this process relies on a distorted representation of the hand. Here, I investigated whether similar distortions are also found on the forearm and how they are affected by the presence of the wrist joint, which forms a categorical, segmental boundary between the hand and the arm. Participants used a baton to judge the perceived location of touches applied to their left hand or forearm. Similar distortions were apparent on both body parts, with overestimation of distances in the medio-lateral axis compared to the proximo-distal axis. There was no perceptual expansion of distances that crossed the wrist boundary. However, there was increased overestimation of distances near the wrist in the medio-lateral orientation. These results replicate recent findings of a distorted representation of the hand underlying tactile spatial remapping, and show that this effect is not idiosyncratic to the hand, but also affects the forearm. These distortions may be a general characteristic of the mental representation of the arms.

    Hand posture modulates perceived tactile distance

    A growing literature shows that body posture modulates the perception of touch, as well as somatosensory processing more widely. In this study, I investigated the effects of changes in the internal postural configuration of the hand on the perceived distance between touches. In two experiments participants positioned their hand in two postures, with the fingers splayed (Apart posture) or pressed together (Together posture). In Experiment 1, participants made forced-choice judgments of which of two tactile distances felt bigger, one oriented with the proximal-distal hand axis (Along orientation) and one oriented with the medio-lateral hand axis (Across orientation). In Experiment 2, participants made verbal estimates of the absolute distance between a single pair of touches, in one of the two orientations. Consistent with previous results, there was a clear bias to perceive distances in the across orientation as larger than those in the along orientation. Perceived tactile distance was also modulated by posture, with increased judgments in both orientations when the fingers were splayed. These results show that changes in the internal posture of the hand modulate the perceived distance between touches on the hand, and add to a growing literature showing postural modulation of touch.